Block coordinate descent for smooth nonconvex constrained minimization

Authors

E. G. Birgin and J. M. Martínez

Abstract

At each iteration of a Block Coordinate Descent method, one minimizes an approximation of the objective function with respect to a generally small set of variables, subject to the constraints in which these variables are involved. The unconstrained case and the case of simple constraints have been analyzed in the recent literature. In this paper we address the problem in which the block constraints are not simple and, moreover, are defined by global sets of equations and inequations. A general algorithm that minimizes quadratic models with quadratic regularization over the blocks is defined, and its convergence and complexity are proved. In particular, given tolerances $\delta>0$ and $\varepsilon>0$ for feasibility/complementarity and optimality, respectively, it is shown that a measure of $(\delta,0)$-criticality tends to zero, and that the number of iterations and functional evaluations required to achieve $(\delta,\varepsilon)$-criticality is $O(\varepsilon^{-2})$. Numerical experiments in which the proposed algorithm is used to solve a continuous version of the traveling salesman problem are presented.
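To make the scheme concrete, here is a minimal sketch of one cycle of such a method, not the paper's actual algorithm: on each block, a quadratic model (a first-order term plus quadratic regularization) is minimized subject to the constraints involving that block. All names (`grad_f`, `blocks`, `constraints_for_block`, `rho`) are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def bcd_cycle(grad_f, x, blocks, constraints_for_block, rho=1.0):
    """One cycle of regularized block coordinate descent: for each block,
    minimize a quadratic model of the objective (first-order term plus
    quadratic regularization) subject to the constraints in which the
    block variables are involved."""
    x = x.copy()
    for idx in blocks:                      # idx: index array of one block
        g = grad_f(x)[idx]                  # partial gradient on the block
        x_blk = x[idx].copy()

        def model(z, g=g, x_blk=x_blk):
            # linear term + quadratic regularization = quadratic model
            return g @ (z - x_blk) + 0.5 * rho * np.sum((z - x_blk) ** 2)

        res = minimize(model, x_blk,
                       constraints=constraints_for_block(idx, x))
        x[idx] = res.x                      # accept the block minimizer
    return x
```

Here `constraints_for_block(idx, x)` would return the SciPy-style constraint specifications (e.g. dictionaries with `'type'` and `'fun'`) that mention the variables of block `idx`, with the remaining variables fixed at their current values.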


Similar articles

Accelerated Stochastic Block Coordinate Gradient Descent for Sparsity Constrained Nonconvex Optimization

We propose an accelerated stochastic block coordinate descent algorithm for nonconvex optimization under sparsity constraint in the high dimensional regime. The core of our algorithm is leveraging both stochastic partial gradient and full partial gradient restricted to each coordinate block to accelerate the convergence. We prove that the algorithm converges to the unknown true parameter at a l...
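The mixing of a stochastic partial gradient with a full partial gradient per block is reminiscent of SVRG-style variance reduction restricted to a block, followed by hard thresholding to enforce the sparsity constraint. A rough sketch under those assumptions (the oracle `grad_i`, the reference pair `x_ref`/`mu_ref`, and the sparsity level `s` are hypothetical names, not this paper's notation):

```python
import numpy as np

def hard_threshold(x, s):
    """Keep the s largest-magnitude entries of x, zero the rest."""
    out = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-s:]
    out[keep] = x[keep]
    return out

def svrg_block_step(x, x_ref, mu_ref, block, grad_i, i, step, s):
    """x_ref, mu_ref: reference point and its full gradient (refreshed
    occasionally); grad_i(x, i): gradient of the i-th sample at x.
    The variance-reduced direction v combines the stochastic partial
    gradient with the full partial gradient on the chosen block."""
    v = grad_i(x, i)[block] - grad_i(x_ref, i)[block] + mu_ref[block]
    x = x.copy()
    x[block] = x[block] - step * v
    return hard_threshold(x, s)             # enforce sparsity constraint
```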


Convergence of a Block Coordinate Descent Method for Nondifferentiable Minimization

We study the convergence properties of a (block) coordinate descent method applied to minimize a nondifferentiable (nonconvex) function f(x_1, …, x_N) with certain separability and regularity properties. Assuming that f is continuous on a compact level set, the subsequence convergence of the iterates to a stationary point is shown when either f is pseudoconvex in every pair of coordinate ...
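For reference, the classical cyclic scheme analyzed in this line of work sweeps the coordinates in order and exactly minimizes the objective along each one. A bare-bones sketch, with `f`, `x0`, and `n_sweeps` as illustrative names:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def cyclic_cd(f, x0, n_sweeps=100):
    """Cyclic coordinate descent with exact one-dimensional minimization
    along each coordinate in turn."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_sweeps):
        for i in range(x.size):
            def f_i(t, i=i):                # f restricted to coordinate i
                y = x.copy()
                y[i] = t
                return f(y)
            x[i] = minimize_scalar(f_i).x   # exact 1-D minimization
    return x
```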


Smooth minimization of nonsmooth functions with parallel coordinate descent methods

We study the performance of a family of randomized parallel coordinate descent methods for minimizing the sum of a nonsmooth and a separable convex function. The problem class includes as a special case L1-regularized L1 regression and the minimization of the exponential loss ("AdaBoost problem"). We assume the input data defining the loss function is contained in a sparse m × n matrix A with at ...
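A rough sketch of one iteration of such a randomized parallel method, assuming coordinate-wise Lipschitz constants `L` and an ESO-type constant `beta` that accounts for the coupling between simultaneous updates (all names are assumptions, not this paper's notation):

```python
import numpy as np

def parallel_cd_step(x, grad, L, tau, beta, rng=None):
    """Sample tau coordinates uniformly at random and update them in
    parallel, each with a step size 1/(beta * L_i)."""
    rng = np.random.default_rng() if rng is None else rng
    S = rng.choice(x.size, size=tau, replace=False)  # random coordinate set
    g = grad(x)
    x = x.copy()
    x[S] -= g[S] / (beta * L[S])                     # independent updates
    return x
```

The constant `beta` grows with the degree of coupling between coordinates; for fully separable data it would be 1, recovering independent coordinate steps.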


Robust Block Coordinate Descent

In this paper we present a novel randomized block coordinate descent method for the minimization of a convex composite objective function. The method uses (approximate) partial second-order (curvature) information, so that the algorithm performance is more robust when applied to highly nonseparable or ill-conditioned problems. We call the method Robust Coordinate Descent (RCD). At each iteratio...
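One way such partial second-order information can enter a block update is a damped Newton step restricted to the sampled block. The following sketch assumes hypothetical oracles `grad` and `hess_block` and is not the paper's actual method:

```python
import numpy as np

def curvature_block_step(x, grad, hess_block, block, damping=1e-6):
    """Solve a small Newton-type system on the sampled block, using the
    corresponding diagonal block of (an approximation to) the Hessian.
    Damping guards against ill-conditioned block systems."""
    g = grad(x)[block]
    H = hess_block(x, block)                  # |block| x |block| matrix
    H = H + damping * np.eye(len(block))      # regularize for robustness
    x = x.copy()
    x[block] -= np.linalg.solve(H, g)         # block Newton direction
    return x
```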


SparseNet: Coordinate Descent With Nonconvex Penalties.

We address the problem of sparse selection in linear models. A number of nonconvex penalties have been proposed in the literature for this purpose, along with a variety of convex-relaxation algorithms for finding good solutions. In this article we pursue a coordinate-descent approach for optimization, and study its convergence properties. We characterize the properties of penalties suitable for...
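As an illustration of a coordinate-descent update with a nonconvex penalty, here is the standard thresholding step for the MCP penalty under simplifying assumptions (standardized predictors: each column of X has unit squared norm); the names and constants are assumptions for this simplified setting, not this paper's exact algorithm:

```python
import numpy as np

def mcp_threshold(z, lam, gamma):
    """Minimizer in b of 0.5*(z - b)**2 + MCP(b; lam, gamma), gamma > 1."""
    if abs(z) <= lam:
        return 0.0
    if abs(z) <= gamma * lam:
        return np.sign(z) * (abs(z) - lam) / (1.0 - 1.0 / gamma)
    return z

def cd_sweep(X, y, beta, lam, gamma):
    """One cycle of coordinate descent for 0.5*||y - X beta||^2 + MCP."""
    r = y - X @ beta                          # current residual
    for j in range(X.shape[1]):
        z = X[:, j] @ r + beta[j]             # partial-residual fit
        new = mcp_threshold(z, lam, gamma)
        r += X[:, j] * (beta[j] - new)        # keep residual consistent
        beta[j] = new
    return beta
```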



Journal

Journal: Computational Optimization and Applications

Year: 2022

ISSN: 0926-6003, 1573-2894

DOI: https://doi.org/10.1007/s10589-022-00389-5